
    Citizen participation and awareness raising in coastal protected areas. A case study from Italy

    In this chapter, part of the research carried out within the SECOA project (www.projectsecoa.eu) is presented. Attention is devoted to the methods and tools used to support the participatory process in a case of environmental conflict related to the definition of the boundaries of a coastal protected area: the Costa Teatina National Park, in Abruzzo, central Italy. The Costa Teatina National Park was established by National Law 93/2001. Its territory includes eight southern Abruzzo municipalities and covers a stretch of coastline of approximately 60 km. It is a coastal protected area, incorporating land but not sea, characterized by the presence of important cultural and natural assets. The Italian Ministry of Environment (1998) defines the area as “winding and varied, with the alternation of sandy and gravel beaches, cliffs, river mouths, areas rich in indigenous vegetation and cultivated lands (mainly olives), dunes and forest trees”. The park boundaries were not defined by the law that established it, and their determination was postponed to a later stage of territorial negotiation that has not yet ended (Montanari and Staniscia, 2013). The definition of the park boundaries has indeed resulted in an intense debate between citizens and interest groups who believe that environmental protection does not conflict with economic growth and those who believe the opposite. That is why the process is still ongoing and a solution is far from being reached. In this chapter, the methodology and the tools used to involve the general public in active participation in decision making and to support institutional players in conflict mitigation will be presented. These tools have also proven effective in the dissemination of information and the transfer of knowledge. Results obtained through the use of each instrument will not be presented here, since this falls outside the purpose of the present essay.
The chapter is organized as follows: in the first section, the importance of citizen participation in decision making will be highlighted, with a focus on participation in Integrated Coastal Zone Management (ICZM) processes, relevant to the management of coastal protected areas. In the second section, a review of the most commonly used methods in social research is presented, and the advantages and disadvantages of each are highlighted. In particular, the history and evolution of the Delphi method and its derivatives are discussed, with a focus on the dissemination value of the logic underlying such iterative methods. In the third section, the tools used in the case of the Costa Teatina National Park will be presented; strengths and weaknesses will be highlighted and proposals for their improvement advanced. Discussion and conclusions follow.

    Robust multi-objective optimization of safety barriers performance parameters for NaTech scenarios risk assessment and management

    Safety barriers should be designed to bring the largest benefit, in terms of mitigating the consequences of accidental scenarios, at the most reasonable cost. In this paper, we formulate the identification of the optimal performance parameters of the barriers, those that simultaneously allow for the mitigation of the consequences of Natural Technological (NaTech) accidental scenarios at reasonable cost, as a Multi-Objective Optimization (MOO) problem. The MOO problem is solved for a case study from the literature, consisting of a chemical facility composed of three tanks filled with flammable substances and equipped with six safety barriers (active, passive and procedural), exposed to NaTech scenarios triggered by either severe floods or earthquakes. The performance of the barriers is evaluated by a phenomenological dynamic model that mimics the realistic response of the system. The uncertainty of the relevant parameters of the model (i.e., the response time of active and procedural barriers and the effectiveness of the barriers) is accounted for in the optimization, to provide robust solutions. Results for this case study suggest that the NaTech risk is optimally managed by improving the performance of four of the six barriers (three active and one passive). Practical guidelines are provided to retrofit the safety barrier design.
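As a minimal illustration of the trade-off such a MOO problem captures (mitigation benefit versus cost), a Pareto filter can keep only the non-dominated candidate configurations; the data structure and the configurations below are invented for illustration and are not taken from the paper's case study:

```python
def pareto_front(options):
    """Keep only non-dominated options: a dominates b if a costs no more
    and mitigates at least as much, and the two options are not identical.
    options: {name: (cost, mitigation)} -- illustrative structure."""
    def dominates(a, b):
        return a != b and a[0] <= b[0] and a[1] >= b[1]
    return {name: v for name, v in options.items()
            if not any(dominates(w, v) for w in options.values())}

# Invented barrier configurations: (cost, mitigation score in [0, 1])
configs = {
    "baseline":     (1.0, 0.40),
    "upgrade-A":    (2.5, 0.75),
    "upgrade-A+P":  (4.0, 0.90),
    "gold-plating": (6.0, 0.90),  # same mitigation as upgrade-A+P, higher cost
}
front = pareto_front(configs)  # "gold-plating" is dominated and drops out
```

In the paper's setting the decision variables are the barriers' performance parameters and the objectives are evaluated under parameter uncertainty; the filter above only conveys the dominance idea behind the Pareto-optimal solutions.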

    Robust weighted aggregation of expert opinions in futures studies

    Expert judgments are widespread in many fields, and the way in which they are collected and the procedure by which they are aggregated are considered crucial steps. From a statistical perspective, expert judgments are subjective data and must be gathered and treated as carefully and scientifically as possible. In the elicitation phase, a multitude of experts is preferable to a single expert, and techniques based on anonymity and iterations, such as Delphi, offer many advantages in terms of reducing distortions, which are mainly related to cognitive biases. There are two approaches to the aggregation of the judgments given by a panel of experts, referred to as behavioural (implying an interaction between the experts) and mathematical (involving non-interacting participants and the aggregation of the judgments using a mathematical formula). Both have advantages and disadvantages, and with the mathematical approach, the main problem concerns the subjective choice of an appropriate formula for both normalization and aggregation. We propose a new method for aggregating and processing subjective data collected using the Delphi method, with the aim of obtaining robust rankings of the outputs. This method makes it possible to normalize and aggregate the opinions of a panel of experts, while modelling different sources of uncertainty. We use an uncertainty analysis approach that allows the contemporaneous use of different aggregation and normalization functions, so that the result does not depend on the choice of a specific mathematical formula, thereby solving the problem of choice. Furthermore, we can also model the uncertainty related to the weighting system, which reflects the different expertise of the participants as well as expert opinion accuracy. 
By combining the Delphi method with the robust ranking procedure, we offer a new protocol covering the elicitation, aggregation and processing of subjective data used in the construction of Delphi-based future scenarios. The method is very flexible and can be applied to the aggregation and processing of any subjective judgments, i.e. also those outside the context of futures studies. Finally, we show the validity, reproducibility and potential of the method through its application to the future of Italian families.
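The core idea of the robust ranking (evaluating every combination of candidate normalization and aggregation formulas, so that the result does not hinge on any single choice) can be sketched as follows; the specific functions, names and data here are illustrative assumptions, not the authors' protocol:

```python
import itertools
import statistics

# Candidate normalization functions, applied to one expert's scores
# across all items (illustrative choices)
normalizations = {
    "minmax": lambda xs: [(x - min(xs)) / (max(xs) - min(xs)) for x in xs],
    "rank":   lambda xs: [sorted(xs).index(x) / (len(xs) - 1) for x in xs],
}

# Candidate aggregation functions across experts (illustrative choices)
aggregations = {"mean": statistics.mean, "median": statistics.median}

def robust_scores(judgments):
    """judgments: {item: [score from each expert]}. For every
    normalization/aggregation combination, normalize each expert's scores
    across the items, aggregate per item, then average the item's
    aggregated scores over all combinations."""
    items = list(judgments)
    n_experts = len(next(iter(judgments.values())))
    expert_vecs = [[judgments[it][e] for it in items]
                   for e in range(n_experts)]
    totals = {it: [] for it in items}
    for norm_f, agg_f in itertools.product(normalizations.values(),
                                           aggregations.values()):
        normed = [norm_f(vec) for vec in expert_vecs]
        for i, it in enumerate(items):
            totals[it].append(agg_f([vec[i] for vec in normed]))
    return {it: statistics.mean(v) for it, v in totals.items()}

# Three items scored by three experts (invented data)
scores = robust_scores({"A": [4, 5, 3], "B": [2, 3, 1], "C": [3, 4, 2]})
# An item whose advantage survives every formula combination ranks highest.
```

The paper additionally models uncertainty in the expert weighting system; a weighted variant of this sketch would replace the plain aggregation functions with weighted counterparts drawn from that uncertainty model.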

    Ensembles of climate change models for risk assessment of nuclear power plants

    Climate change affects technical Systems, Structures and Infrastructures (SSIs), changing the environmental context for which SSIs were originally designed. In order to prevent any risk growth beyond acceptable levels, the effects of climate change must be accounted for in risk assessment models. Climate models can provide future climate data, such as air temperature and pressure. However, the reliability of climate models is a major concern, due to the uncertainty in their future temperature and pressure projections. In this work, we consider five climate change models (each individually unable to accurately reproduce historical recorded temperatures and, thus, to provide reliable future projections) and ensemble their projections for integration into a probabilistic safety assessment conditional on climate projections. As a case study, we consider the Passive Containment Cooling System (PCCS) of two AP1000 Nuclear Power Plants (NPPs). Results provided by the different ensembles are compared. Finally, a risk-based classification approach is applied to identify critical future temperatures which may lead to PCCS risks beyond acceptable levels.
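One common way to ensemble model projections is to weight each model by its agreement with the historical record; the inverse-error weighting below is a generic illustration of that idea, not necessarily the aggregation used in the paper:

```python
def ensemble_projection(projections, hist_errors):
    """Weighted ensemble of model projections, with weights inversely
    proportional to each model's error on the historical record
    (a generic heuristic; the weighting scheme is an illustrative
    assumption, not the paper's method)."""
    weights = [1.0 / e for e in hist_errors]
    return sum(w * p for w, p in zip(weights, projections)) / sum(weights)

# Invented example: three models' projected air temperatures (deg C)
# and their mean absolute errors against historical observations.
# The best-performing model (error 1.0) pulls the ensemble toward 30.0.
projected = ensemble_projection([30.0, 32.0, 31.0], [1.0, 2.0, 2.0])
```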

    Multi-source statistics: Basic situations and methods

    Many National Statistical Institutes (NSIs), especially in Europe, are moving from single‐source statistics to multi‐source statistics. By combining data sources, NSIs can produce more detailed and more timely statistics and respond more quickly to events in society. By combining survey data with already available administrative data and Big Data, NSIs can save data collection and processing costs and reduce the burden on respondents. However, multi‐source statistics come with new problems that need to be overcome before the resulting output quality is sufficiently high and before those statistics can be produced efficiently. What complicates the production of multi‐source statistics is that they come in many different varieties, as data sets can be combined in many different ways. Given the rapidly increasing importance of multi‐source statistics in Official Statistics, there has been considerable research activity in this area over the last few years, and some frameworks have been developed for multi‐source statistics. Useful as these frameworks are, they generally do not give guidance as to which method could be applied in a given situation arising in practice. In this paper, we aim to fill that gap, structure the world of multi‐source statistics and its problems, and provide some guidance on suitable methods for these problems.

    Personalizing Cancer Pain Therapy: Insights from the Rational Use of Analgesics (RUA) Group

    Introduction: A previous Delphi survey from the Rational Use of Analgesics (RUA) project involving Italian palliative care specialists revealed some discrepancies between current guidelines and clinical practice, with a lack of consensus on items regarding the use of strong opioids in treating cancer pain. Those results formed the basis for a new Delphi study addressing a better approach to pain treatment in patients with cancer. Methods: The study consisted of a two-round multidisciplinary Delphi study. Specialists rated their agreement with a set of 17 statements using a 5-point Likert scale (0 = totally disagree and 4 = totally agree). Consensus on a statement was achieved if the median consensus score (MCS) (expressed as the value on which at least 50% of participants agreed) was at least 4 and the interquartile range (IQR) was 3–4. Results: This survey included input from 186 palliative care specialists representing the whole Italian territory. Consensus was reached on seven statements. More than 70% of participants agreed with the use of low doses of strong opioids in moderate pain treatment and valued the transdermal route as an effective option when the oral route is not available. There was strong consensus on the importance of knowing opioid pharmacokinetics for therapy personalization and on identifying immediate-release opioids as key for tailoring therapy to patients’ needs. Limited agreement was reached on items regarding breakthrough pain and the management of opioid-induced bowel dysfunction. Conclusion: These findings may assist clinicians in applying clinical evidence to routine care settings and call for a reappraisal of current pain treatment recommendations, with the final aim of optimizing the clinical use of strong opioids in patients with cancer.
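The consensus rule stated in the Methods (median of at least 4 on the 0–4 scale, with the interquartile range contained in 3–4) can be checked mechanically. The function and the example ratings below are illustrative, and treating the IQR condition as Q1 ≥ 3 and Q3 ≤ 4 is one reading of the rule, not necessarily the study's exact computation:

```python
import statistics

def has_consensus(ratings):
    """Delphi consensus check on 0-4 Likert ratings: median at least 4
    and interquartile range within [3, 4], read here as Q1 >= 3 and
    Q3 <= 4 (illustrative interpretation of the abstract's rule)."""
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    return statistics.median(ratings) >= 4 and q1 >= 3 and q3 <= 4

# Invented panels of ratings for two statements
agreed   = has_consensus([4, 4, 4, 3, 4, 4, 3, 4])  # consensus reached
disputed = has_consensus([4, 3, 2, 4, 3, 2, 4, 1])  # no consensus
```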

    Railway bridge structural health monitoring and fault detection: state-of-the-art methods and future challenges

    The importance of railways in the transportation industry is continuously increasing, due to the growing demand for both passenger travel and transportation of goods. However, more than 35% of the 300,000 railway bridges across Europe are over 100 years old, and their reliability directly impacts the reliability of the railway network. This increased demand may lead to a higher risk associated with their unexpected failures, resulting in safety hazards to passengers and an increased whole life cycle cost of the asset. Consequently, one of the most important aspects of evaluating the reliability of the overall railway transport system is bridge structural health monitoring, which can track the health state of the bridge and allow an early detection of failures. In this way, a fast, safe and cost-effective recovery of the optimal health state of the bridge, where the levels of element degradation or failure are maintained efficiently, can be achieved. In this article, after an introduction to the desired features of structural health monitoring, a review of the most commonly adopted bridge fault detection methods is presented. The analysis focuses mainly on model-based finite element updating strategies, non-model-based (data-driven) fault detection methods, such as artificial neural networks, and Bayesian belief network–based structural health monitoring methods. A comparative study, which aims to discuss and compare the performance of the reviewed types of structural health monitoring methods, is then presented by analysing a short-span steel structure of a railway bridge. Opportunities and future challenges of fault detection methods for railway bridges are highlighted.

    On the mechanisms governing gas penetration into a tokamak plasma during a massive gas injection

    A new 1D radial fluid code, IMAGINE, is used to simulate the penetration of gas into a tokamak plasma during a massive gas injection (MGI). The main result is that the gas is in general strongly braked as it reaches the plasma, due to mechanisms related to charge exchange and (to a smaller extent) recombination. As a result, only a fraction of the gas penetrates into the plasma. Also, a shock wave is created in the gas, which propagates away from the plasma, braking and compressing the incoming gas. Simulation results are quantitatively consistent, at least in terms of orders of magnitude, with experimental data for a D2 MGI into a JET Ohmic plasma. Simulations of MGI into the background plasma surrounding a runaway electron beam show that if the background electron density is too high, the gas may not penetrate, suggesting a possible explanation for the recent results of Reux et al in JET (2015 Nucl. Fusion 55 093013).

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate, on a regular basis, updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.